Illusion of validity

Illusion of validity is a cognitive bias in which a person overestimates their ability to interpret and accurately predict the outcome when analyzing a set of data, in particular when the data show a very consistent pattern, that is, when the data "tell" a coherent story. The effect persists even when the person is aware of all the factors that limit the accuracy of their predictions, that is, when the data or the methods used to judge them lead to highly fallible predictions.

Daniel Kahneman, Paul Slovic, and Amos Tversky explain the illusion as follows: "people often predict by selecting the output...that is most representative of the input....The confidence they have in their prediction depends primarily on the degree of representativeness...with little or no regard for the factors that limit predictive accuracy. Thus, people express great confidence in the prediction that a person is a librarian when given a description of his personality which matches the stereotype of librarians, even if the description is scanty, unreliable, or outdated. The unwarranted confidence which is produced by a good fit between the predicted outcome and the input information may be called the illusion of validity."

In one study, for example, subjects reported higher confidence in a prediction of a student's final grade point average after seeing a first-year record of consistent ''B''s than after seeing a first-year record of an equal mix of ''A''s and ''C''s. Consistent patterns tend to appear when input variables are highly redundant or correlated, which may increase subjective confidence. However, a number of highly correlated inputs should not increase confidence much more than any one of them alone; higher confidence is merited only when a number of highly ''independent'' inputs show a consistent pattern.
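The statistical point about redundant versus independent cues can be illustrated with a small simulation. The sketch below uses made-up numbers (a true value of 3.0 and unit-variance cue noise, both arbitrary): averaging five cues whose errors share a common component (correlation `rho` near 1) barely improves on a single cue, while averaging five independent cues shrinks the error substantially.

```python
import random
import statistics

random.seed(0)

TRUE_VALUE = 3.0   # quantity being predicted (e.g. a final GPA); arbitrary
NOISE = 1.0        # standard deviation of each cue's error; arbitrary
N_CUES = 5
TRIALS = 10_000

def mean_abs_error(rho):
    """Mean absolute error of the average of N_CUES noisy cues.

    Each cue's error mixes a shared component and an independent one so
    that every cue has variance NOISE**2 and pairwise correlation rho.
    """
    errors = []
    for _ in range(TRIALS):
        shared = random.gauss(0, NOISE)
        cues = [TRUE_VALUE
                + (rho ** 0.5) * shared
                + ((1 - rho) ** 0.5) * random.gauss(0, NOISE)
                for _ in range(N_CUES)]
        errors.append(abs(statistics.mean(cues) - TRUE_VALUE))
    return statistics.mean(errors)

print(f"mean error, independent cues (rho=0.0): {mean_abs_error(0.0):.3f}")
print(f"mean error, redundant cues   (rho=0.9): {mean_abs_error(0.9):.3f}")
```

The redundant cues look just as "consistent" to an observer, yet the averaged prediction is roughly twice as far from the truth, which is why agreement among correlated inputs should not raise confidence much.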


Discovery and description

This bias was first described by Amos Tversky and Daniel Kahneman in their 1973 paper "On the Psychology of Prediction". In a 2011 article, Kahneman recounted the story of his discovery of the illusion of validity. After completing an undergraduate psychology degree and spending a year as an infantry officer in the Israeli Army, he was assigned to the army's Psychology Branch, where he helped evaluate candidates for officer training using a test called the Leaderless Group Challenge. Candidates were taken to an obstacle field and assigned a group task so that Kahneman and his fellow evaluators could discern their individual leadership qualities, or lack thereof.

Although Kahneman and his colleagues emerged from the exercise with very clear judgments as to who was and was not a potential leader, their forecasts proved "largely useless" in the long term. Comparing their original evaluations of candidates with the judgments of the officer-training school commanders months later, Kahneman and his colleagues found that their own "ability to predict performance at the school was negligible. Our forecasts were better than blind guesses, but not by much." Yet when asked to assess another group of candidates, their judgments were as clear as before. "The dismal truth about the quality of our predictions," recalled Kahneman, "had no effect whatsoever on how we evaluated new candidates and very little effect on the confidence we had in our judgments and predictions." Kahneman found this striking: "The statistical evidence of our failure should have shaken our confidence in our judgments of particular candidates, but it did not. It should also have caused us to moderate our predictions, but it did not." Kahneman named this cognitive fallacy "the illusion of validity".
Decades later, Kahneman reflected that at least part of the reason for his and his colleagues' failure in assessing the officer candidates was that they had been confronted with a difficult question but had, without realizing it, answered an easier one instead. "We were required to predict a soldier's performance in officer training and in combat, but we did so by evaluating his behavior over one hour in an artificial situation. This was a perfect instance of a general rule that I call WYSIATI, 'What you see is all there is.' We had made up a story from the little we knew but had no way to allow for what we did not know about the individual's future, which was almost everything that would actually matter."


Other examples

Comparing the results of 25 wealth advisers over an eight-year period, Kahneman found that none of them stood out consistently as better or worse than the others. "The results," as he put it, "resembled what you would expect from a dice-rolling contest, not a game of skill." Yet at the firm for which all these advisers worked, no one seemed to be aware of this: "The advisers themselves felt they were competent professionals performing a task that was difficult but not impossible, and their superiors agreed." Kahneman informed the firm's directors that they were "rewarding luck as if it were skill." The directors believed him, yet "life in the firm went on just as before." The directors clung to the "illusion of skill," as did the advisers themselves.

The scientist Freeman Dyson has recalled his experience as a statistician with RAF Bomber Command during World War II, analyzing the command's operations. At the time, an officer argued that because of the heavy gun turrets they carried, the bombers were too slow and could not fly high enough to avoid being shot down; he suggested removing the turrets and gunners. But the commander in chief rejected the suggestion because, said Dyson, "he was blinded by the illusion of validity." He was not alone: everyone in the command "saw every bomber crew as a tightly knit team of seven, with the gunners playing an essential role defending their comrades against fighter attack." Part of this illusion "was the belief that the team learned by experience. As they became more skillful and more closely bonded, their chances of survival would improve." Yet the statistics, Dyson found, proved that all this was an illusion: deaths occurred randomly, having nothing to do with experience. Members of Bomber Command, he realized, were dying unnecessarily because everyone was taken in by an illusion.

In 2014, an article in ''Rolling Stone'' presented as fact an accusation of rape at the University of Virginia that proved to be false. The magazine's writers and editors, the university president and other administrators, and many U.Va. students were quick to believe the false charges. Harlan Loeb later explained this as an example of the illusion of validity in action.

In 2012, a sportswriter who described Kahneman as his "favorite scientist" wrote: "The illusion of validity is why I get deeply suspicious whenever a fan, sportswriter, coach, or GM says anything to the effect of 'the numbers don't tell the whole story.' This is, in fact, true, but what the person saying this usually means is 'I don't care what the numbers say because I am convinced that what I have seen is correct.' Which is, thanks to this illusion, almost never true. If I make an argument that the data says a player isn't good, and someone points out 'Yes, but if you watch the games you will notice that this year they are only shooting threes from the slot, and rarely from the corner, where he used to excel,' then that person is pointing out a hole in the data that's worth investigating. If the argument is along the lines of 'anyone who's watching him can clearly see he's much better than that,' then I'm certain the illusion of validity is doing its dirty work."

In a 1981 paper, J. B. Bushyhead and J. J. Christensen-Szalanski studied data from an outpatient clinic showing that doctors there ordered chest radiographs only for patients who manifested clinical attributes linked to some pneumonia cases, rather than for patients manifesting clinical attributes associated with all pneumonia cases. They attributed this behavior to the illusion of validity. Other settings in which the phenomenon appears include job interviews, wine tasting, stock markets, and political strategy.
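Kahneman's dice-rolling comparison for the wealth advisers can be sketched as a simulation. The numbers below are invented (25 advisers, 8 years, returns drawn from the same distribution for everyone): when performance is pure luck, the correlation between adviser rankings in different years hovers near zero, which is essentially what Kahneman reported finding in the firm's real data.

```python
import random

random.seed(42)

N_ADVISERS = 25
N_YEARS = 8

# Each adviser's yearly "performance" is pure luck: i.i.d. random returns.
returns = [[random.gauss(0.05, 0.15) for _ in range(N_YEARS)]
           for _ in range(N_ADVISERS)]

def rank(values):
    """Rank positions (0 = best) for a list of values."""
    order = sorted(range(len(values)), key=lambda i: -values[i])
    ranks = [0] * len(values)
    for pos, i in enumerate(order):
        ranks[i] = pos
    return ranks

def pearson(xs, ys):
    """Pearson correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Correlate adviser rankings across every pair of years.
year_ranks = [rank([returns[a][y] for a in range(N_ADVISERS)])
              for y in range(N_YEARS)]
corrs = [pearson(year_ranks[y1], year_ranks[y2])
         for y1 in range(N_YEARS) for y2 in range(y1 + 1, N_YEARS)]
avg = sum(corrs) / len(corrs)
print(f"average rank correlation across year pairs: {avg:.3f}")
```

An average near zero is the statistical signature of a dice-rolling contest; a genuine game of skill would show the same advisers ranking high year after year.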


Causes

The illusion of validity may be caused in part by confirmation bias and/or the representativeness heuristic, and may in turn contribute to the overconfidence effect. Among the factors contributing to the illusion of validity, according to Meinolf Dierkes, Ariane Berthoin Antal, John Child, and Ikujiro Nonaka, are "a person's tendency to register the frequency of events more than their probability"; "the impossibility of gathering information about alternative assumptions if action is based on a hypothesis"; a "disregard of base-rate information"; and "the self-fulfilling prophecy," or "a behavior manifested in individuals or groups because it was expected."
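The role of base-rate neglect can be made concrete with Bayes' rule, using the librarian example from the lead. The numbers below are purely illustrative (the base rate and fit probabilities are made up): even a description that fits the librarian stereotype strongly yields only a small posterior probability when librarians are rare in the reference population.

```python
# Bayes' rule with illustrative, made-up numbers for the librarian example.
p_librarian = 0.002           # hypothetical base rate of librarians
p_other = 1 - p_librarian
p_fit_given_librarian = 0.90  # description fits 90% of librarians...
p_fit_given_other = 0.05      # ...but also 5% of everyone else

# P(librarian | description fits) by Bayes' rule
posterior = (p_fit_given_librarian * p_librarian) / (
    p_fit_given_librarian * p_librarian + p_fit_given_other * p_other)
print(f"P(librarian | fits stereotype) = {posterior:.3f}")
```

Under these assumptions the posterior is only about 3.5%, far below the near-certainty that representativeness-driven judgment tends to produce.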


Overcoming the illusion

To avoid being taken in by the illusion of validity, according to Kahneman, one should ask two questions: "Is the environment in which the judgment is made sufficiently regular to enable predictions from the available evidence? The answer is yes for diagnosticians, no for stock pickers. Do the professionals have an adequate opportunity to learn the cues and the regularities? The answer here depends on the professionals' experience and on the quality and speed with which they discover their mistakes." While many professionals "easily pass both tests," meaning that their "off-the-cuff judgments" are of value, in general the judgments of "assertive and confident people" should be taken with a grain of salt "unless you have independent reason to believe that they know what they are talking about." This can be difficult, however, because "overconfident professionals...act as experts and look like experts," and it can be a "struggle to remind yourself that they may be in the grip of an illusion."

In his article on the false rape case at the University of Virginia, Harlan Loeb outlined an approach to avoiding the illusion of validity in cases which, like that one, involve "a highly emotional and personal issue that has national resonance, high-profile media coverage and an organization already on the defensive with recent issues." He advised, first: "Always challenge (appropriately, of course) facts and assumptions that many rely on to inform their thinking and decision-making about risk and crisis management." Second: "In situations with palpable unknowns, where the illusion of validity in decision-making is a material threat, push hard to do research, polling and active listening to help identify the levers and pulleys that shape the operating and environmental realities of the present risk." Third: "Determine existing organizational challenges that will prevent leadership from making decisions consistently, effectively, and quickly in the face of uncertainty." Fourth: "Be aware of how current actions could dictate future strategy." Fifth: "Be ready to take on current risk to manage future risk."

Phil Thornton has offered the following advice for avoiding the illusion of validity in the financial sector. First, "remember that just because previous generations were successful following certain approaches, replicating their actions may not necessarily be a good idea." Second, "remember that the consequences of decisions being wrong can be more important than the probability of them being correct."


See also

* List of cognitive biases


References


Bibliography

* Tversky, Amos; Kahneman, Daniel (1974). "Judgment under uncertainty: Heuristics and biases". ''Science'' 185 (4157): 1124–1131. doi:10.1126/science.185.4157.1124.